# Domain-specific BERT
## DA BERT Old News V1
The first transformer model trained on historical texts from Denmark's absolute-monarchy period (1660–1849), developed by researchers at Aalborg University to process historical Danish that differs substantially from the modern language.
- License: MIT
- Type: Large Language Model (Other)
- Publisher: CALDISS-AAU
- Downloads: 48 · Likes: 2
## Industry BERT SEC v0.1
A BERT-based sentence embedding model optimized for the financial and regulatory domain, trained on SEC filings.
- License: Apache-2.0
- Type: Text Embedding (Transformers)
- Publisher: llmware
- Downloads: 8,587 · Likes: 9
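The listing does not say how this model turns per-token BERT outputs into a single sentence vector; a common approach for sentence-embedding models is masked mean pooling. The sketch below (names, shapes, and values are illustrative, not from the model card) shows the idea with NumPy:

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padding positions.

    token_embeddings: (seq_len, hidden) array of per-token vectors.
    attention_mask:   (seq_len,) array of 1s for real tokens, 0s for padding.
    """
    mask = attention_mask[:, None].astype(token_embeddings.dtype)  # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)
    count = max(mask.sum(), 1e-9)  # avoid division by zero for all-padding input
    return summed / count

# Toy example: 4 positions, the last one is padding and must not affect the result.
emb = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [100.0, 100.0]])
mask = np.array([1, 1, 1, 0])
vec = mean_pool(emb, mask)  # → [3.0, 4.0]
```

Pooling over the attention mask (rather than a plain mean) keeps padded positions from skewing the sentence vector when sequences in a batch have different lengths.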
## KM-BERT
A BERT model for Korean medical NLP, based on the KR-BERT architecture and pre-trained on a 116-million-word Korean medical corpus.
- Type: Large Language Model (Transformers, Korean)
- Publisher: madatnlp
- Downloads: 241 · Likes: 6
## Agriculture BERT Base Chinese
A BERT model optimized for the agricultural domain, trained with masked language modeling (MLM) self-supervision.
- Type: Large Language Model (Transformers, Chinese)
- Publisher: gigilin7
- Downloads: 14 · Likes: 2
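This entry names MLM self-supervision as the training method. For readers unfamiliar with it, here is a minimal sketch of BERT's standard masking recipe, where 15% of positions are selected and, of those, 80% become `[MASK]`, 10% become a random token, and 10% stay unchanged (the token id and vocabulary size are placeholders, not taken from this model):

```python
import random

MASK_ID = 103        # illustrative [MASK] token id; the real value depends on the vocab
VOCAB_SIZE = 21128   # illustrative vocab size (e.g. a Chinese BERT vocabulary)

def mask_tokens(token_ids, mask_prob=0.15, rng=None):
    """BERT-style MLM corruption.

    Returns (corrupted_ids, labels): labels holds -100 for unselected
    positions (ignored by the loss) and the original id otherwise.
    """
    rng = rng or random.Random(0)
    inputs, labels = list(token_ids), []
    for i, tid in enumerate(token_ids):
        if rng.random() < mask_prob:
            labels.append(tid)
            r = rng.random()
            if r < 0.8:
                inputs[i] = MASK_ID                # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
            # else 10%: keep the original token
        else:
            labels.append(-100)
    return inputs, labels
```

The model is then trained to predict the original ids at the selected positions, which is what lets a domain corpus (here, agricultural Chinese text) reshape the pretrained representations without any manual labels.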
## Legalbert-pt FP
Legalbert-pt is a language model specialized for the Portuguese legal domain through pre-training; it can be fine-tuned for specific downstream tasks.
- License: OpenRAIL
- Type: Large Language Model (Transformers)
- Publisher: raquelsilveira
- Downloads: 738 · Likes: 6
## BiomedNLP BiomedBERT Base Uncased (Abstract + Full-Text)
BiomedBERT is a biomedical domain-specific language model pretrained on PubMed abstracts and PubMed Central full-text articles, achieving state-of-the-art results on multiple biomedical NLP tasks.
- License: MIT
- Type: Large Language Model (English)
- Publisher: microsoft
- Downloads: 1.7M · Likes: 240